3 research outputs found

    SONEX: An Evaluation Exchange Framework for Reproducible Sonification

    Degara N, Nagel F, Hermann T. SONEX: An Evaluation Exchange Framework for Reproducible Sonification. In: Strumiłło P, Bujacz M, Popielata M, eds. Proceedings of the 19th International Conference on Auditory Display (ICAD 2013). Łódź, Poland: Lodz University of Technology Press; 2013: 167-174.

    After 18 ICAD conferences, Auditory Display has become a mature research community. However, robust evaluation and scientific comparison of sonification methods are often neglected by auditory display researchers: at the most recent conference, ICAD 2012, only one paper out of 53 made a statistical comparison of several sonification methods, and even there no comparison with other state-of-the-art algorithms was provided. In this paper, we review successful evaluation standards in other communities and transfer them to derive recommendations and best practices for auditory display research. We describe SonEX (Sonification Evaluation eXchange), a community-based framework for the formal evaluation of sonification methods, and discuss the goals, challenges and architecture of this evaluation platform. In addition, we introduce a simple example of a task definition that follows the SonEX guidelines. This paper aims to start a lively discussion towards the establishment of thorough scientific methodologies for auditory display research and the definition of standardized sonification tasks.
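    The abstract mentions an example task definition but does not reproduce it. The following is a minimal sketch, in Python, of what a community-defined, reproducible evaluation task could look like; the field names, dataset URL and accuracy metric are illustrative assumptions, not the schema from the paper.

        from dataclasses import dataclass, field
        from typing import Callable, Dict, List

        # Hypothetical SonEX-style task definition. The fields below are
        # assumptions for illustration, not the paper's actual schema.
        @dataclass
        class SonificationTask:
            task_id: str           # unique, citable task identifier
            description: str       # what listeners must accomplish
            dataset_url: str       # shared data, so results are reproducible
            # Scores listener responses against the ground truth.
            metric: Callable[[List[int], List[int]], float]
            # Published scores that new methods can be compared against.
            baseline_results: Dict[str, float] = field(default_factory=dict)

        def accuracy(ground_truth: List[int], responses: List[int]) -> float:
            """Fraction of listener responses that match the ground truth."""
            hits = sum(1 for g, r in zip(ground_truth, responses) if g == r)
            return hits / len(ground_truth)

        # Example: listeners judge which of two streams carries the target event.
        task = SonificationTask(
            task_id="sonex-example-localization-v1",
            description="Identify the stream (0 or 1) containing the target event.",
            dataset_url="https://example.org/sonex/localization-v1",  # placeholder
            metric=accuracy,
            baseline_results={"random-guessing": 0.5},
        )

        score = task.metric([0, 1, 1, 0], [0, 1, 0, 0])  # one error -> 0.75
        print(f"{task.task_id}: accuracy={score:.2f} vs {task.baseline_results}")

    A shared task identifier, dataset and metric are the minimal ingredients that let independent groups report directly comparable scores, which is the gap the paper identifies in auditory display research.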

    InfoDrops: Sonification for Enhanced Awareness of Resource Consumption in the Shower

    Hammerschmidt J, Tünnermann R, Hermann T. InfoDrops: Sonification for Enhanced Awareness of Resource Consumption in the Shower. In: Strumiłło P, Bujacz M, Popielata M, eds. Proceedings of the 19th International Conference on Auditory Display (ICAD 2013). Łódź, Poland: Lodz University of Technology Press; 2013: 57-64.

    Although most of us strive to behave sustainably and use fewer resources, this is unfortunately difficult, because we are often unaware of the relevant information or our attention lies elsewhere. Based on this observation, we present a new approach to an unobtrusive and affective ambient auditory information display for becoming and staying aware of water and energy consumption while taking a shower. Using the interaction sounds of water drops falling onto the bathtub as a carrier of information, our system keeps users in touch with resource-related variables. We explore the use of an affective dimension as an additional layer of information and introduce our 4/5-factor approach for adapting the auditory display's output so that it supports a slow but steady adjustment of personal showering habits over time. We present and discuss several alternative sound and interaction designs.
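    The abstract does not detail the mapping itself. As a rough illustration of the general technique (parameter-mapping sonification), the Python sketch below steers pitch, brightness and drop density from a consumption ratio; the budget values, mapping ranges and parameter choices are invented for demonstration, not the InfoDrops design.

        import math

        # Hypothetical per-shower budgets; a real system would use
        # user- or household-specific targets.
        TARGET_LITERS = 40.0
        TARGET_KWH = 1.5

        def drop_sound_params(liters_used: float, kwh_used: float) -> dict:
            """Map current resource consumption to waterdrop sound parameters.

            Below budget, the drops stay in a neutral register; above it,
            pitch and brightness rise so the change is noticeable without
            demanding visual attention.
            """
            ratio = max(liters_used / TARGET_LITERS, kwh_used / TARGET_KWH)
            # Pitch rises up to one octave as consumption reaches 2x budget.
            pitch_hz = 880.0 * 2.0 ** min(max(ratio - 1.0, 0.0), 1.0)
            # Brightness (e.g. a low-pass cutoff) opens with consumption.
            cutoff_hz = 1000.0 + 4000.0 * min(ratio, 2.0) / 2.0
            # Drop density grows smoothly, hinting at "more water flowing".
            drops_per_second = 4.0 + 8.0 * math.tanh(ratio)
            return {"pitch_hz": pitch_hz, "cutoff_hz": cutoff_hz,
                    "drops_per_second": drops_per_second}

        for liters in (10.0, 40.0, 70.0):
            print(liters, "L ->", drop_sound_params(liters, kwh_used=liters * 0.03))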

    Interactive Sonification of Collaborative AR-based Planning Tasks for Enhancing Joint Attention

    Neumann A, Hermann T. Interactive Sonification of Collaborative AR-based Planning Tasks for Enhancing Joint Attention. In: Strumiłło P, Bujacz M, Popielata M, eds. Proceedings of the 19th International Conference on Auditory Display (ICAD 2013). Łódź, Poland: Lodz University of Technology Press; 2013: 49-55.

    This paper introduces a novel sonification-based interaction support for cooperating users in an Augmented Reality setting. Head-mounted AR displays limit the field of view, which causes users to miss important activities, such as object interactions or deictic references, that their interaction partner uses to (re-)establish joint attention. We introduce an interactive sonification that makes the object manipulations of both interaction partners mutually transparent through sounds that convey information about the kind of activity and can optionally even identify the object itself. In this paper we focus on the sonification method, the interaction design and the sound design; furthermore, we render the sonification both from sensor data (e.g. object tracking) and from manual annotations. As a spin-off, we further propose this method for enhancing interaction observation, data analysis, and multimodal annotation in interactional linguistics and conversation analysis.
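    As a purely illustrative sketch (not the authors' implementation), the Python fragment below shows the general idea of sonifying interaction events: the kind of activity selects a timbre and the manipulated object optionally selects a pitch. The event fields, timbre names and mappings are assumptions for demonstration.

        from dataclasses import dataclass

        # One recognizable timbre per kind of activity (names are invented).
        ACTIVITY_TIMBRE = {
            "grasp": "soft_pluck",
            "move": "whoosh",
            "release": "click",
            "point": "ping",   # deictic reference by the partner
        }

        # Optional object identification via pitch (values are invented).
        OBJECT_PITCH = {
            "red_block": 440.0,
            "blue_block": 523.25,
            "green_block": 659.26,
        }

        @dataclass
        class InteractionEvent:
            actor: str      # which interaction partner acted
            activity: str   # grasp / move / release / point
            obj: str        # tracked object identifier

        def sonify(event: InteractionEvent) -> dict:
            """Translate one tracked (or annotated) event into sound parameters."""
            return {
                "timbre": ACTIVITY_TIMBRE[event.activity],
                "pitch_hz": OBJECT_PITCH.get(event.obj, 330.0),  # fallback pitch
                # Pan by actor, so each partner is audible on 'their' side.
                "pan": -0.5 if event.actor == "partner_a" else 0.5,
            }

        # Events could come from object tracking or manual annotations alike.
        for ev in (InteractionEvent("partner_a", "grasp", "red_block"),
                   InteractionEvent("partner_b", "point", "blue_block")):
            print(ev.actor, ev.activity, "->", sonify(ev))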